Key Findings
This analysis draws on 2,046 CalFresh applications submitted online in San Diego County. It includes applicant information, activity on the site, and final approval outcomes — but does not capture actions taken outside the platform, like mailed documents or phone interviews.
1. Factors Associated With Approval
To understand what predicts CalFresh approval, I fit a logistic regression model using application data from GetCalFresh.org. The model estimates how each factor — like income, document uploads, and interview completion — is associated with the chance of being approved, while accounting for all other variables in the dataset.
Variables were selected based on program relevance, user behavior, and insights from exploratory analysis. I scaled income per $500 for interpretability and recoded interview completion to retain applicants with missing responses.
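The preprocessing described above can be sketched as follows. This is illustrative only: the column names (`monthly_income`, `interview_completed`) are hypothetical stand-ins, since the actual GetCalFresh field names aren't shown here.

```python
import pandas as pd

# Hypothetical column names -- the real GetCalFresh export may differ.
apps = pd.DataFrame({
    "monthly_income": [0, 1200, 2500],
    "interview_completed": ["yes", None, "no"],
})

# Scale income per $500 so a one-unit change in the model
# corresponds to a $500 difference in monthly income.
apps["income_per_500"] = apps["monthly_income"] / 500

# Recode interview completion so applicants with a missing response
# are kept as their own category instead of being dropped.
apps["interview_status"] = apps["interview_completed"].fillna("no_response")
```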
The goal wasn’t just to predict outcomes, but to understand where applicants might drop off and which steps in the process matter most.
Key Findings:
- Applicants who confirmed completing the interview had a predicted approval rate of 72%, compared to 50% for those who didn’t confirm.
- Uploading documents with the application was associated with a higher chance of approval.
- Higher income was associated with a lower chance of approval — even among mostly income-eligible applicants.
- Having children in the household was associated with higher approval odds.
- Housing stability and application time didn’t show strong associations once other factors were considered.
The table below shows model results as odds ratios, which estimate how each factor shifts the odds of approval while controlling for all the others. For example, an odds ratio of 1.5 means the odds of approval (not the probability) are 50% higher for that group than for the baseline.
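A quick numeric sketch of that interpretation, using a hypothetical coefficient: an odds ratio of 1.5 applied to a 50% baseline lifts the approval probability to 60%, not 75%.

```python
import math

# A logistic-regression coefficient is a change in log-odds;
# exponentiating it gives the odds ratio shown in the table.
coef = math.log(1.5)          # hypothetical coefficient
odds_ratio = math.exp(coef)   # recovers 1.5

# Baseline odds of 1.0 (a 50% approval rate) scaled by an odds ratio of 1.5:
baseline_odds = 0.5 / (1 - 0.5)
new_odds = baseline_odds * odds_ratio
new_prob = new_odds / (1 + new_odds)  # 0.6, i.e. 60% -- not 75%
```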
The model explained a meaningful amount of variation in approval outcomes (McFadden R² = 0.15) and ranked a randomly chosen approved application above a randomly chosen denied one 76% of the time (AUC = 0.76). These results suggest the model performs well given the limited behavioral data available from the application platform.
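Both metrics can be computed directly from predicted probabilities. The sketch below uses toy data rather than the actual applications: McFadden's R² compares the model's log-likelihood to a null model that predicts the overall approval rate for everyone, and AUC is the share of (approved, denied) pairs the model ranks correctly.

```python
import numpy as np

# Toy outcomes (1 = approved) and predicted probabilities -- illustrative only.
y = np.array([1, 1, 0, 0, 1, 0])
p = np.array([0.8, 0.5, 0.4, 0.3, 0.6, 0.55])

# McFadden R^2 = 1 - (model log-likelihood / null log-likelihood),
# where the null model predicts the overall approval rate for everyone.
ll_model = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
p_null = y.mean()
ll_null = np.sum(y * np.log(p_null) + (1 - y) * np.log(1 - p_null))
mcfadden_r2 = 1 - ll_model / ll_null

# AUC: share of (approved, denied) pairs where the approved application
# gets the higher predicted probability (ties ignored for brevity).
pos, neg = p[y == 1], p[y == 0]
auc = np.mean([pi > pj for pi in pos for pj in neg])
```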
To make the results easier to interpret, the table below shows predicted approval rates for example applicant scenarios, based on the fitted model.
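As a minimal sketch of how such scenario predictions are computed from a fitted logistic model: the intercept and interview coefficient below are back-calculated from the 50% and 72% rates reported above, not taken from the fitted model itself.

```python
import math

def predicted_approval(log_odds):
    """Logistic function: convert a linear predictor (log-odds) to a probability."""
    return 1 / (1 + math.exp(-log_odds))

# Hypothetical coefficients, back-calculated from the reported rates:
# a 50% baseline implies an intercept of 0, and a 72% rate for
# confirmed interviews implies a log-odds shift of log(0.72 / 0.28).
intercept = 0.0
b_interview = math.log(0.72 / 0.28)

p_baseline = predicted_approval(intercept)                 # 0.50
p_interview = predicted_approval(intercept + b_interview)  # ~0.72
```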
These results show that relatively small steps — like completing an interview or uploading a document — can meaningfully increase the chance of approval. Many of these steps could be supported through timely nudges, simpler workflows, or user-centered reminders.
2. Potential Improvements
The model points to clear opportunities to improve approval outcomes by supporting applicants at key decision points.
Recommendations:
- Support interview completion: Many eligible applicants do not confirm completing the interview. Providing reminders, real-time scheduling, or alternative follow-up methods could increase follow-through.
- Encourage early document uploads: Uploading documents with the application was strongly associated with approval. Nudging users to upload early — especially those likely to qualify — could reduce denials.
- Address geographic disparities: Approval rates vary significantly by ZIP code. Further analysis could explore whether these reflect staffing, broadband access, or population needs — and help inform place-based outreach strategies.
Strengthening these steps would not only increase approval rates, but also improve equity and access for those most in need.